Logical Aspects of Computational Linguistics: An Introduction
Abstract
The papers in this collection are all devoted to a single theme: logic and its applications in computational linguistics. They share many themes, goals and techniques, and any editorial classification is bound to highlight some connections at the expense of others. Nonetheless, we have found it useful to divide these papers (somewhat arbitrarily) into the following four categories: logical semantics of natural language, grammar and logic, mathematics with linguistic motivations, and computational perspectives. In this introduction, we use this four-way classification as a guide to the papers and, more generally, to the research agenda that underlies them. We hope that the reader will find it a useful starting point to the collection.

1 Logical semantics of natural language

Logical semantics of natural language is a diverse field that draws on many disciplines, including philosophical and mathematical logic, linguistics, computer science, and artificial intelligence. Probably the best way to get a detailed overview is to consult the many excellent survey articles in [19]; this Handbook is likely to be the standard reference for work on logic in linguistics for some time to come. In the space of a short introduction like this, it is difficult to give an accurate impression of such a diverse area; nonetheless, by approaching the subject historically, we can at least hope to give a sense of the general direction the field is taking, and indicate how the articles on logical semantics to be found in this collection fit in with recent developments.

Modern logic begins with the work of Frege, and so does logical semantics. His celebrated Begriffsschrift [50] was not only one of the first formal treatments of quantified logic, it also discussed the structure of natural language quite explicitly. Moreover, Frege's work inspired others to attempt explicit logical treatments of natural language semantics: Carnap [25] and Reichenbach [98] are important (and still widely referenced) examples. Nonetheless, in spite of such impressive precedents, it is usual to regard the work of Montague as the starting point of the truly modern phase of formal semantics. Why is this?

Montague was a highly original thinker, and his three seminal articles on natural language semantics (English as a formal language, The proper treatment of quantification in ordinary English, and Universal Grammar, all in [116]) synthesise the best of what had gone before (namely, the possible worlds semantics of Carnap [25] and Kripke [72], together with the higher order logic of Church [41]) with a potent new idea: semantic construction need not be a hit-or-miss affair that relies on intuition. Precise (and indeed reasonably straightforward) algorithms can be given which translate interesting chunks of natural language into logical formalisms. Moreover, as Montague proved in a very general way in Universal Grammar, if the translation procedure obeys certain natural rules, the semantics of the logical formalism induces the desired semantics on the natural language input.

From the perspective of the 1990s it can be difficult to appreciate how revolutionary Montague's vision was, but this is largely because his insights are now a standard part of every working semanticist's toolkit. To be sure, semanticists rarely work with Montague's type theory any more, and Montague himself might well have been surprised by the wide range of semantic issues now addressed using logical methods. Nonetheless, the vision that drives modern logical semantics is essentially Montague's: it is possible to model an important part of natural language semantics using logical tools, and the construction of semantic representations is algorithmic.
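To make the algorithmic character of this vision concrete, here is a minimal sketch of compositional translation. It is, of course, not Montague's own fragment (his translations target a typed intensional logic, not strings), and the toy grammar and lexicon below are invented purely for illustration; the point is only that the meaning of a phrase is computed mechanically by applying the meaning of one daughter to the meaning of the other.

```python
# A toy illustration of compositional semantic construction
# (illustrative only; not Montague's actual fragment or type theory).
# Word meanings are Python functions that build first-order formulas as
# strings; the meaning of a binary phrase is obtained by function application.

lexicon = {
    # proper name, "type-raised": a function from properties to formulas
    "John":  lambda p: p("john"),
    # determiner: takes a noun meaning, then a verb-phrase meaning
    "every": lambda noun: lambda vp: f"forall x.({noun('x')} -> {vp('x')})",
    "woman": lambda x: f"woman({x})",
    "walks": lambda x: f"walk({x})",
}

def translate(tree):
    """Leaves are looked up in the lexicon; at a binary node the left
    daughter's meaning is applied to the right daughter's meaning."""
    if isinstance(tree, str):
        return lexicon[tree]
    left, right = tree
    return translate(left)(translate(right))

print(translate(("John", "walks")))             # walk(john)
print(translate((("every", "woman"), "walks"))) # forall x.(woman(x) -> walk(x))
```

In Montague's own treatment the same compositional discipline is enforced by the typing of the lambda terms, and the model theory of the target logic then delivers truth conditions for the English input.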
Montague's work appeared in the late 1960s and the early 1970s. The 1970s were, by and large, a period during which his ideas were assimilated by the linguistic community. Thanks largely to the efforts of Partee (see, for example, [92]), Montague's logical methods (which linguists found strange and new, and often intimidating) slowly gained acceptance. During this period, Montague's ideas were extended to cover a wider range of linguistic phenomena; the investigations into the temporal semantics of natural language recorded by Dowty in [45] are a nice example of such work.

If the key themes of the 1970s were "assimilation" and "development", the key theme of the 1980s was "diversity". For a start, two new approaches to logical semantics were developed: the Situation Semantics of Barwise and others [15] and the Discourse Representation Theory of Kamp and others [68,60,69]. Like Montague semantics, Situation Semantics offered a model-theoretic perspective on natural language semantics, but it changed the ground rules in seemingly radical ways. For one thing, whereas Montague's logic was total (that is, every statement had to be either true or false, and not both, at every pair of a possible world and a time), Barwise and Perry took seriously the idea that real situations were often "small", partially specified chunks of reality. Accordingly, it seemed appropriate to drop the assumption that every sentence had to be either true or false in every situation; hence Situation Semantics naturally leads to the use of partial logics. Furthermore, stipulating the conditions under which sentences were true was no longer to be the central explanatory idea in semantics. Rather, meaning was to be viewed as a relation between situations, a relation that emerged from the interplay of complex constraints.

The agenda set by Discourse Representation Theory (DRT) seemed, at first, less revolutionary. Essentially, DRT proposed using an intermediate level of representation in a more essential way than Montague's work did. Although Montague made use of translations into an intermediate level of logical representation, in his work this level is a convenience that is, in principle, eliminable. Kamp and Heim showed that by making heavier (indeed, intrinsic) use of this intermediate level, interesting treatments could be given of temporal phenomena, and a number of troublesome problems involving quantifier scope could be resolved. This may sound reasonably innocuous, but DRT has in fact proved far more genuinely subversive of received semantic ideas than Situation Semantics. For a start, the idea that an intermediate level of representation is intrinsic to meaning is both interesting and highly controversial philosophically. Moreover, from DRT stems one of the key words of modern semantics: dynamics. Thinking of semantics in terms of DRT leads fairly directly to an interesting, and very different, conception of meaning: meanings as functions which update contexts with new information. This essentially computational metaphor has developed into an important perspective on semantics; [58] by Groenendijk and Stokhof is a prominent example of such work.
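The update metaphor is easy to state in computational terms. The following sketch (a simplification invented for this introduction, not the system of [58] or of DRT proper) takes an information state to be a set of assignments of individuals to discourse referents, and a meaning to be a function from states to states: an indefinite extends the assignments with a new referent, a predication filters them, and a later pronoun can simply reuse a referent that an earlier update has made available.

```python
# A minimal sketch of "meaning as context update" (illustrative only).
# An information state is a set of assignments, each represented as a
# sorted tuple of (discourse referent, individual) pairs.

DOMAIN = {"pedro", "chiquita", "daisy"}
DONKEYS = {"chiquita", "daisy"}
OWNS = {("pedro", "chiquita")}

def exists(ref):
    """An indefinite: extend every assignment with a value for a new referent."""
    def update(state):
        return {tuple(sorted({**dict(g), ref: d}.items()))
                for g in state for d in DOMAIN}
    return update

def test(condition):
    """A predication: keep only the assignments that satisfy it."""
    def update(state):
        return {g for g in state if condition(dict(g))}
    return update

def sequence(*updates):
    """A discourse: compose the updates of its sentences left to right."""
    def update(state):
        for u in updates:
            state = u(state)
        return state
    return update

# "A donkey ... Pedro owns it": the pronoun reuses referent "x",
# which the earlier indefinite has introduced into the context.
discourse = sequence(
    exists("x"),
    test(lambda g: g["x"] in DONKEYS),
    test(lambda g: ("pedro", g["x"]) in OWNS),
)

print(discourse({()}))   # {(('x', 'chiquita'),)} -- the surviving assignment
```

The interest of the dynamic perspective lies precisely in examples of this kind: the pronoun is interpretable because the indefinite has changed the context, not because it happens to lie in the indefinite's syntactic scope.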
While DRT and Situation Semantics are perhaps the two key new ideas that emerged in the 1980s, a number of other themes emerged then that continue to play an important role. For a start, linguists slowly became aware of other traditions in logical semantics (the game-theoretic approach instigated by Hintikka is perhaps the most important example; see [105]). Moreover, perhaps for the first time, semantic investigations began feeding important ideas and questions back into logic itself; the development of generalised quantifier theory by Barwise and Cooper and by van Benthem [14,16] is an important example of this.

So where do we stand in the 1990s? Here the key theme seems to be "integration". At the logical level, there seems to be an emerging consensus that what is needed are frameworks in which the insights gathered in previous work can be combined in a revealing way. For example, work in Situation Semantics and DRT has made clear that ideas such as partiality and dynamism are important; but are there well-behaved logical systems which capture such ideas? Some interesting answers are emerging here: for example, Muskens in [90] shows how the insights concerning partiality and situations can be integrated into classical type theories that are simpler (and far more elegant) than Montague's, while Ranta in [97] shows that constructive type theory offers a uniform logical setting for exploring many current problems in logical semantics, including those raised by DRT. A second emerging theme is the use of inference as a tool for solving semantic puzzles. This is a natural step to take: one of the advantages of doing semantics in well-understood formalisms (such as various kinds of type theory) is that the inference mechanisms these formalisms possess offer plausible mechanisms for modeling further phenomena. A third, and crucial, theme is how best to link all this semantical and logical work with the new and sophisticated approaches to syntax (and phonology) that have been developed over the past fifteen or so years. The syntax-semantics interface is likely to be a focus of research in coming years.

Bearing these general remarks in mind, let's consider the individual contributions. Sloppy identity, by Claire GARDENT, exploits for semantic purposes the existence of Higher Order Unification (HOU) algorithms for classical type theory. The key linguistic idea is that a wide variety of sloppy interpretation phenomena can be seen as arising from a semantic constraint on parallel structures. As the author shows, by making use of HOU (essentially a powerful pattern-matching mechanism) a precise theory of sloppy interpretation can be given for such parallel constructions. Among other things, this theory captures the interaction of sloppy/strict ambiguity with quantification and binding.
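To see what such an HOU analysis looks like, consider the textbook ellipsis "John likes his cat, and Bill does too" (the example and notation here are illustrative and are not taken from Gardent's paper). The parallelism constraint asks for a property P which, applied to the source subject, reproduces the meaning of the source clause; higher order unification returns several solutions, two of which correspond to the strict and sloppy readings of the elided clause:

```latex
\[
\begin{aligned}
\text{equation:} \quad & P(\mathit{john}) = \mathit{like}(\mathit{john}, \mathit{cat}(\mathit{john}))\\
\text{strict:}   \quad & P = \lambda x.\, \mathit{like}(x, \mathit{cat}(\mathit{john}))
  \;\Longrightarrow\; P(\mathit{bill}) = \mathit{like}(\mathit{bill}, \mathit{cat}(\mathit{john}))\\
\text{sloppy:}   \quad & P = \lambda x.\, \mathit{like}(x, \mathit{cat}(x))
  \;\Longrightarrow\; P(\mathit{bill}) = \mathit{like}(\mathit{bill}, \mathit{cat}(\mathit{bill}))
\end{aligned}
\]
```

On the strict reading Bill likes John's cat; on the sloppy reading he likes his own.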
The papers Vagueness and type theory, by Pascal BOLDINI, and A natural language explanation for formal proofs, by Yann COSCOY, are both based on constructive type theory, but they use it for very different purposes. Boldini's paper gives an analysis of certain so-called vague predicates, such as "small", which, in addition to creating a problem for logical semantics, have played a role in the philosophical controversy between constructive and classical logic. Coscoy's paper is associated with one of the major computer applications of constructive type theory, proof editors. It explains an algorithm that produces natural language texts to express formal proofs, and discusses some stylistic principles for improving the texts. Coscoy's algorithm is actually available as part of the latest release of the proof editor Coq [12].

The paper A belief-centered treatment of pragmatic presupposition, by Lucia MANARA and Anne DE ROECK, addresses the problem of linguistic presupposition: the phenomenon that the meaningfulness of a sentence may demand the truth of some proposition. For instance, "John's wife is away" presupposes that John has a wife. This problem is approached in terms of the logical calculus of property theory, which offers new ideas about how beliefs interact with the phenomenon of presupposition.

Whereas the previous four papers essentially exploit, in various ways, the fact that formal semantics can be done in standard logical systems (be it classical type theory, constructive type theory, or property theory), Language understanding: a procedural perspective, by Ruth KEMPSON, Wilfried MEYER VIOL and Dov GABBAY, is an attempt to view the syntax-semantics interface from a logical perspective. In particular, the authors show how an incremental parser can be formulated using the idea of labeled deductions, aided by a modal logic of finite trees. They argue that this blend of ideas can deal with crossover phenomena in a way that eludes other frameworks.